Cross-Domain Multitask Learning with Latent Probit Models: Supplementary Material

Authors

  • Shaobo Han
  • Xuejun Liao
  • Lawrence Carin
Abstract

We conducted symmetric multitask learning experiments on the landmine datasets (Xue et al., 2007), which comprise a total of 19 tasks in a homogeneous feature space. We compare the label-prediction accuracy of three methods: (i) cross-domain multitask learning with latent probit models, (ii) single-task learning with a separate probit classifier per task, and (iii) simply pooling the data from all tasks and learning a single probit classifier.
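As a point of reference, a minimal sketch of the two probit baselines, (ii) single-task learning and (iii) pooling, is given below. The list `tasks`, the per-task (X_tr, y_tr, X_te, y_te) split, and the use of statsmodels' Probit are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of baselines (ii) and (iii), assuming each of the 19
# landmine tasks is given as a tuple (X_tr, y_tr, X_te, y_te) in a list
# called `tasks` (hypothetical names; not the authors' code).
import numpy as np
import statsmodels.api as sm

def probit_accuracy(X_tr, y_tr, X_te, y_te):
    """Fit a probit classifier and return 0/1 accuracy at threshold 0.5."""
    res = sm.Probit(y_tr, sm.add_constant(X_tr)).fit(disp=0)
    p = res.predict(sm.add_constant(X_te))
    return np.mean((p > 0.5).astype(int) == y_te)

def run_baselines(tasks):
    # (ii) Single-task learning: one probit classifier per task.
    stl_acc = [probit_accuracy(X_tr, y_tr, X_te, y_te)
               for X_tr, y_tr, X_te, y_te in tasks]

    # (iii) Pooling: stack the training data of all tasks, learn one
    # probit classifier, and evaluate it on each task's test set.
    X_pool = np.vstack([X_tr for X_tr, _, _, _ in tasks])
    y_pool = np.concatenate([y_tr for _, y_tr, _, _ in tasks])
    pooled = sm.Probit(y_pool, sm.add_constant(X_pool)).fit(disp=0)
    pool_acc = [np.mean((pooled.predict(sm.add_constant(X_te)) > 0.5).astype(int) == y_te)
                for _, _, X_te, y_te in tasks]
    return np.mean(stl_acc), np.mean(pool_acc)
```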


Similar Articles

Cross-Domain Multitask Learning with Latent Probit Models

Learning multiple tasks across heterogeneous domains is a challenging problem since the feature space may not be the same for different tasks. We assume the data in multiple tasks are generated from a latent common domain via sparse domain transforms and propose a latent probit model (LPM) to jointly learn the domain transforms, and a probit classifier shared in the common domain. To learn mean...

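One schematic way to write the generative structure described in that abstract is sketched below; the notation (A_t, s_{tj}, w, Φ) is illustrative and not taken from the paper.

```latex
% Schematic only: x_{tj} is example j of task t in its native feature space,
% s_{tj} its latent common-domain representation, A_t a sparse domain
% transform, w the probit weights shared across tasks, and \Phi the
% standard normal CDF.
\begin{align*}
  x_{tj} &= A_t\, s_{tj} + \epsilon_{tj}, \qquad A_t \ \text{sparse},\\
  \Pr\left(y_{tj} = 1 \mid s_{tj}\right) &= \Phi\!\left(w^{\top} s_{tj}\right).
\end{align*}
```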

Bayesian Multitask Learning with Latent Hierarchies

We learn multiple hypotheses for related tasks under a latent hierarchical relationship between tasks. We exploit the intuition that for domain adaptation, we wish to share classifier structure, but for multitask learning, we wish to share covariance structure. Our hierarchical model is seen to subsume several previously proposed multitask learning models and performs well on three distinct rea...


Using Both Latent and Supervised Shared Topics for Multitask Learning

This paper introduces two new frameworks, Doubly Supervised Latent Dirichlet Allocation (DSLDA) and its non-parametric variation (NP-DSLDA), that integrate two different types of supervision: topic labels and category labels. This approach is particularly useful for multitask learning, in which both latent and supervised topics are shared between multiple categories. Experimental results on bot...


Co-Clustering for Multitask Learning

This paper presents a new multitask learning framework that learns a shared representation among the tasks, incorporating both task and feature clusters. The jointly induced clusters yield a shared latent subspace where task relationships are learned more effectively and more generally than in state-of-the-art multitask learning methods. The proposed general framework enables the derivation of m...


Flexible Modeling of Latent Task Structures in Multitask Learning

Multitask learning algorithms are typically designed assuming some fixed, a priori known latent structure shared by all the tasks. However, it is usually unclear what type of latent task structure is the most appropriate for a given multitask learning problem. Ideally, the “right” latent task structure should be learned in a data-driven manner. We present a flexible, nonparametric Bayesian mode...



Journal:

Volume   Issue

Pages  -

Publication date: 2013